Reviews: Learning Erdos-Renyi Random Graphs via Edge Detecting Queries
This paper studies the task of learning an Erdős–Rényi random graph via edge-detecting queries. Each query (test) takes as input a set of nodes X, and its output Y(X) \in \{0, 1\} indicates the absence or presence of at least one edge among the nodes of X. The authors derive information-theoretic and algorithmic bounds on the number of tests required to achieve perfect reconstruction of the graph in the asymptotic limit where the number of nodes n tends to infinity, analyzing three known algorithms: COMP, DD, and SSS. The derived bounds are tight up to the asymptotic constant. A new algorithm is also proposed, based on the GROTESQUE algorithm, achieving a sublinear computational time in the decoding step that is near optimal compared to the derived bounds.
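The query model and the COMP decoder described above can be illustrated with a small simulation. This is a minimal sketch, not the paper's implementation: the instance size `n`, edge probability `q`, test count `num_tests`, and inclusion probability `p` are all illustrative choices, and `query` is a hypothetical helper realizing the edge-detecting oracle. COMP's rule is that any node pair appearing together in a negative test cannot be an edge; every pair never ruled out is declared an edge.

```python
import itertools
import random

random.seed(0)

# Illustrative instance: n nodes, Erdos-Renyi edge probability q (assumed values).
n, q = 12, 0.1
nodes = range(n)
edges = {frozenset(e) for e in itertools.combinations(nodes, 2) if random.random() < q}

def query(subset):
    """Edge-detecting test: returns 1 iff the queried node set contains an edge."""
    return int(any(e <= subset for e in edges))

# Non-adaptive Bernoulli design: each node joins each test independently with prob. p.
num_tests, p = 300, 0.3
tests = [frozenset(v for v in nodes if random.random() < p) for _ in range(num_tests)]
outcomes = [query(t) for t in tests]

# COMP decoding: rule out any pair whose endpoints both appear in a negative test;
# declare every surviving pair an edge.
candidates = {frozenset(c) for c in itertools.combinations(nodes, 2)}
for t, y in zip(tests, outcomes):
    if y == 0:
        candidates -= {c for c in candidates if c <= t}
decoded = candidates
```

By construction COMP can only err in one direction: a true edge is never eliminated (a test containing both endpoints is always positive), so `decoded` always contains the true edge set, and false positives shrink as the number of tests grows.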
The main contributions are threefold:
- An algorithm-independent probabilistic lower bound on the number of non-adaptive tests, obtained via Fano's inequality (Theorem 1).
- Under Bernoulli testing, upper bounds for COMP (Theorem 2) and DD (Theorem 3).
- Under Bernoulli testing, a lower bound for SSS (Theorem 4).
These in particular show that DD is asymptotically optimal under Bernoulli testing when \theta \ge 1/2. Although the three review scores exhibited a relatively large split in the initial round of review, after the authors' rebuttal as well as discussion among the reviewers, all three reviewers converged on positive ratings. I would thus like to recommend acceptance of this paper.
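The DD algorithm discussed in the contributions can be sketched in the same spirit as COMP. This is a hedged illustration under assumed parameters (`n`, `q`, `num_tests`, `p` are all illustrative, and `query` is a hypothetical oracle), not the paper's code: DD first runs COMP's elimination, then certifies a pair as a definite edge only when it is the sole surviving candidate inside some positive test, and outputs only certified pairs.

```python
import itertools
import random

random.seed(1)

# Illustrative instance (assumed values).
n, q = 10, 0.08
nodes = range(n)
edges = {frozenset(e) for e in itertools.combinations(nodes, 2) if random.random() < q}

def query(subset):
    """Edge-detecting test: returns 1 iff the queried node set contains an edge."""
    return int(any(e <= subset for e in edges))

num_tests, p = 400, 0.3
tests = [frozenset(v for v in nodes if random.random() < p) for _ in range(num_tests)]
outcomes = [query(t) for t in tests]

# Step 1 (COMP elimination): rule out pairs seen together in a negative test.
candidates = {frozenset(c) for c in itertools.combinations(nodes, 2)}
for t, y in zip(tests, outcomes):
    if y == 0:
        candidates -= {c for c in candidates if c <= t}

# Step 2 (DD certification): a positive test whose surviving candidates reduce
# to a single pair certifies that pair as a definite edge.
definite = set()
for t, y in zip(tests, outcomes):
    if y == 1:
        inside = [c for c in candidates if c <= t]
        if len(inside) == 1:
            definite.add(inside[0])
```

The asymmetry to COMP is visible here: since every true edge survives Step 1, a positive test with a single surviving candidate must contain a true edge, so DD never reports a false positive; its errors are missed edges that are never certified.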
Learning Erdos-Renyi Random Graphs via Edge Detecting Queries
In this paper, we consider the problem of learning an unknown graph via queries on groups of nodes, with the result indicating whether or not at least one edge is present among those nodes. While learning arbitrary graphs with n nodes and k edges is known to be hard in the sense of requiring \Omega( \min\{ k^2 \log n, n^2 \}) tests (even when a small probability of error is allowed), we show that learning an Erd\H{o}s-R\'enyi random graph with an average of \kbar edges is much easier; namely, one can attain asymptotically vanishing error probability with only O(\kbar \log n) tests. We establish such bounds for a variety of algorithms inspired by the group testing problem, with explicit constant factors indicating a near-optimal number of tests, and in some cases asymptotic optimality including constant factors. In addition, we present an alternative design that permits a near-optimal sublinear decoding time of O(\kbar \log^2 \kbar + \kbar \log n).
Li, Zihan, Fresacher, Matthias, Scarlett, Jonathan
Papers published at the Neural Information Processing Systems Conference.